Background: Medical procedure training requires frequent feedback and varied educational interventions. Analyzing gestures in the context of medical procedure training helps trainees understand the critical maneuvers that ensure successful completion of a procedure. Most gesture feedback consists of an instructor suggesting an alteration to the trainee's form or position, which is difficult to convey in telehealth procedure education: remote trainees lack the in-person interaction with the instructor that on-site trainees have. These challenges arise in settings such as online physical-exam education for medical students and medical procedure training for rural, disaster, or wilderness scenarios. Because few tools exist to address this gap, we developed a software program that uses OpenPose and downstream data processing to quantify gestures and help remote trainees learn new procedural skills.
Methods: Novice healthcare providers were recorded during an ultrasound-guided central venous catheterization (US-CVC) training session. Each trainee was paired with one physician instructor, who modeled the procedure and assisted with its completion. For this feasibility study, gestures throughout the training were analyzed from video data to identify which might be especially useful for completing the procedure. With our software, we processed a single frame capturing the precise moment at which each individual inserted the central-line needle into the mannequin. Keypoint data from both arms were then processed to identify critical angles for insertion of the syringe. Both left and right arm angles of the trainees were compared with the instructor's respective angles to assess whether trainees were mirroring the instructor's gestures, indicating successful procedure completion.
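The arm-angle extraction described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the study's published code: the indices follow the standard OpenPose BODY_25 layout (2-4 for the right shoulder/elbow/wrist, 5-7 for the left), and the helper name `arm_angle` is our own.

```python
import math

# BODY_25 keypoint indices for shoulder, elbow, wrist on each arm
# (standard OpenPose layout; helper names are ours, not the paper's code).
RIGHT_ARM = (2, 3, 4)   # RShoulder, RElbow, RWrist
LEFT_ARM = (5, 6, 7)    # LShoulder, LElbow, LWrist

def arm_angle(keypoints, arm):
    """Interior elbow angle in degrees from three (x, y) keypoints."""
    (sx, sy), (ex, ey), (wx, wy) = (keypoints[i] for i in arm)
    v1 = (sx - ex, sy - ey)   # vector elbow -> shoulder
    v2 = (wx - ex, wy - ey)   # vector elbow -> wrist
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    # Clamp to guard against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
```

A fully extended arm yields an angle near 180 degrees, so values such as the 163.1-degree probe-holding posture reported below correspond to a nearly straight arm.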
Results: Seven trainees and six instructors were analyzed from a cohort of 10 trainees and 10 instructors. A total of 13 frames were processed by the OpenPose algorithm, yielding 325 keypoints (25 keypoints per individual). The mean instructor left arm angle, while holding the ultrasound probe, was 163.1 degrees (SD = 9.7); the mean instructor right arm angle, while holding the syringe, was 109.9 degrees. The mean trainee left arm angle was 160.9 degrees (SD = 12.7) and the mean trainee right arm angle was 102.1 degrees (SD = 18.4). Because each trainee was matched to a particular teacher as that trainee's gold standard, we used two-tailed paired t-tests to examine differences between trainee and teacher angles for each arm. For the left arm, the mean trainee-teacher difference was 2.24 ± 20.31 degrees (95% CI -16.54 to 21.03 degrees), p = .78; for the right arm, it was 7.84 ± 14.88 degrees (95% CI -5.92 to 21.60), p = .21. In this pilot data, the trainees' arm angles did not differ significantly from their teachers' angles.
Discussion: These results suggest that trainees held arm angles similar to their instructors', and that successful completion of a procedure can be measured quantitatively from video. Assessing whether trainees perform a procedure effectively is challenging, especially from a 2-D video, yet some of these limitations may be overcome with quantitative gestural analysis. Remote medical procedure training stands to benefit from this form of feedback, since it is often difficult to convey over video conferencing alone how a trainee should alter their position. With real-time data, instructors can suggest a change in the trainee's gestures, allowing the trainee to adjust and successfully complete the procedure.
Our findings illuminate the utility of quantitative gesture analysis to overcome the challenges of communicating qualitative gestures and to help trainees learn new procedures and maneuvers through telehealth video platforms.
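The paired trainee-teacher comparison can be sketched as below. The helper computes only the t statistic from matched differences; the angle values in the test are illustrative assumptions, not the study's raw measurements, and in practice the two-sided p-value would come from a t distribution with n - 1 degrees of freedom (e.g. via scipy.stats).

```python
import math
import statistics

def paired_t(trainee, teacher):
    """Mean difference, SD of differences, and paired t statistic
    for matched trainee/teacher angle measurements (trainee - teacher)."""
    diffs = [a - b for a, b in zip(trainee, teacher)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)                    # sample SD, n - 1 denominator
    t_stat = mean_d / (sd_d / math.sqrt(len(diffs)))  # t = mean(d) / SE(d)
    return mean_d, sd_d, t_stat
```

The two-sided p-value is then `scipy.stats.t.sf(abs(t_stat), n - 1) * 2`, where n is the number of trainee-teacher pairs.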
-
Abstract: We combine new data from the Karl G. Jansky Very Large Array with previous radio observations to create a more complete picture of the ongoing interactions between the radio jet from galaxy NGC 541 and the star-forming system known as Minkowski's Object (MO). We then compare those observations with synthetic radio data generated from a new set of magnetohydrodynamic simulations of jet-cloud interactions specifically tailored to the parameters of MO. The combination of radio intensity, polarization, and spectral index measurements all convincingly support the interaction scenario and provide additional constraints on the local dynamical state of the intracluster medium and the time since the jet-cloud interaction first began. In particular, we show that only a simulation with a bent radio jet can reproduce the observations.
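One of the measurements mentioned above, the radio spectral index, is a simple two-frequency quantity. The sketch below assumes the S ∝ ν^α convention (some radio work uses S ∝ ν^-α); the function name and the frequencies in the test are illustrative, not taken from the paper's dataset.

```python
import math

def spectral_index(s1, nu1, s2, nu2):
    """Two-point radio spectral index alpha, assuming S proportional to nu**alpha.

    s1, s2: flux densities measured at frequencies nu1, nu2 (same units each).
    """
    return math.log(s1 / s2) / math.log(nu1 / nu2)
```

A flux density that halves when the frequency doubles gives alpha = -1, typical of steep-spectrum synchrotron emission.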
-
The COVID-19 pandemic brought to the forefront an unprecedented need for experts, as well as citizens, to visualize spatio-temporal disease surveillance data. Web application dashboards were quickly developed to fill this gap, including those built by JHU, WHO, and CDC, but each of these dashboards supports a particular niche view of the pandemic (ie, current status or specific regions). In this paper, we describe our work developing our own COVID-19 Surveillance Dashboard, available at https://nssac.bii.virginia.edu/covid19/dashboard/, which offers a universal view of the pandemic while also allowing users to focus on the details that interest them. From the beginning, our goal was to provide a simple visual way to compare, organize, and track near-real-time surveillance data as the pandemic progresses. Our dashboard includes a number of advanced features for zooming, filtering, categorizing, and visualizing multiple time series on a single canvas. In developing this dashboard, we also identified six key metrics, which we call the 6Cs standard and propose as a standard for the design and evaluation of real-time epidemic science dashboards. Our dashboard was one of the first released to the public and remains one of the most visited and highly used. Our group uses it to support federal, state, and local public health authorities, and it is used by people worldwide to track the pandemic's evolution, build their own dashboards, and support their organizations as they plan their responses to the pandemic. We illustrate the utility of our dashboard by describing how it can be used to support data storytelling, an important emerging area in data science.